A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network
Authors
Abstract:
Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). The absence of an appropriate structure, convergence to local optima, and slow learning are deficiencies of the FWNN training algorithms in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs and address these deficiencies. Differential Evolution (DE) is used as the global search. The main contributions of this paper are: (i) a new, fast, and effective local search based on spatial distribution, named Spatial Distribution Local Search (SDLS), which adaptively adjusts the step size of the parameters to obtain better ones; (ii) an adaptive selection method that chooses appropriate individuals from the current population for local refinement in the MA; (iii) an improved selection operator for standard DE based on an adaptive strategy, in which a worse offspring still has a chance to replace its parent, preventing entrapment in local optima and controlling the selection pressure. The proposed MA is compared with several FWNN training algorithms on benchmark problems. The experimental results confirm that the proposed MA improves the convergence rate and modeling accuracy compared with the other training methods.
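The abstract only outlines the training procedure, so the following Python sketch shows one plausible reading of the memetic loop: DE as the global search, a non-greedy selection that occasionally keeps a worse offspring, and a step-shrinking refinement of a few good individuals standing in for SDLS. The function names, the acceptance probability `accept_prob`, the refinement rule, and all default parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def adaptive_de_selection(parent, offspring, f_parent, f_offspring, accept_prob, rng):
    """Non-greedy DE selection (assumed form): a worse offspring may still
    replace its parent with probability `accept_prob`, loosening the
    greedy selection pressure so the search can escape local optima."""
    if f_offspring <= f_parent or rng.random() < accept_prob:
        return offspring, f_offspring
    return parent, f_parent

def memetic_de_train(fitness, dim, pop_size=30, gens=200, F=0.5, CR=0.9,
                     accept_prob=0.05, local_rate=0.1, seed=0):
    """Sketch of a memetic DE loop: DE global search plus a simple
    step-shrinking local refinement standing in for SDLS."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1.0, 1.0, size=(pop_size, dim))    # candidate FWNN parameter vectors
    fit = np.array([fitness(x) for x in pop])

    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1 mutation and binomial crossover
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, size=3, replace=False)]
            mutant = a + F * (b - c)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True               # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            # adaptive (non-greedy) selection
            pop[i], fit[i] = adaptive_de_selection(
                pop[i], trial, fit[i], fitness(trial), accept_prob, rng)

        # local refinement of a few promising individuals (placeholder for SDLS)
        for i in np.argsort(fit)[:max(1, int(local_rate * pop_size))]:
            step = pop.std(axis=0) + 1e-12                # per-dimension population spread
            for _ in range(5):                            # halve the step when it stops helping
                cand = pop[i] + rng.normal(0.0, step)
                f_cand = fitness(cand)
                if f_cand < fit[i]:
                    pop[i], fit[i] = cand, f_cand
                else:
                    step = step * 0.5

    best = int(np.argmin(fit))
    return pop[best], fit[best]

# toy usage: minimize a quadratic "model error" just to exercise the loop
if __name__ == "__main__":
    target = np.linspace(-0.5, 0.5, 8)
    params, err = memetic_de_train(lambda x: float(np.sum((x - target) ** 2)), dim=8, gens=50)
    print(err)
```

Keeping `accept_prob` small preserves most of DE's greedy pressure while still allowing occasional uphill moves, which is one common way to balance exploration and selection pressure.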
Similar articles
A conjugate gradient based method for Decision Neural Network training
The Decision Neural Network is a new approach for solving multi-objective decision-making problems based on artificial neural networks. By using imprecise evaluation data, network training has been improved and the number of required training data sets has been reduced. The existing training method is based on gradient descent (backpropagation), and one of its limitations is its convergence speed. Therefore,...
Artificial neural network regression as a local search heuristic for ensemble strategies in differential evolution
Nature frequently serves as an inspiration for developing new algorithms to solve challenging real-world problems. Mathematical modeling has led to the development of artificial neural networks (ANNs), which have proven especially useful for solving problems such as classification and regression. Moreover, evolutionary algorithms (EAs), inspired by Darwin's natural evolution, have been successful...
Image Backlight Compensation Using Recurrent Functional Neural Fuzzy Networks Based on Modified Differential Evolution
In this study, an image backlight compensation method using adaptive luminance modification is proposed for efficiently obtaining clear images. The proposed method combines the fuzzy C-means clustering method, a recurrent functional neural fuzzy network (RFNFN), and a modified differential evolution. The proposed RFNFN is based on the two backlight factors that can accurately detect the compensat...
A Novel Differential Evolution Based Algorithm for Higher Order Neural Network Training
In this paper, an application of an adaptive differential evolution (DE) algorithm for training higher order neural networks (HONNs), especially the Pi-Sigma Network (PSN), has been introduced. The proposed algorithm is a variant of DE/rand/2/bin and incorporates two modifications to avoid the shortcomings of DE/rand/2/bin. The base vector for perturbation is the best vector out of the three random...
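The summary above is cut off, but the idea it names, taking the best of several randomly drawn vectors as the DE base vector, can be sketched roughly as follows. The function name, the number of indices drawn, and the exact DE/rand/2-style combination are illustrative assumptions, not the cited paper's code.

```python
import numpy as np

def best_base_rand2_mutant(pop, fit, F=0.5, rng=None):
    """Hedged sketch: a DE/rand/2-style mutant whose base vector is the best
    (lowest-fitness) of three randomly drawn individuals. `pop` is an
    (n, dim) NumPy array and `fit` the matching fitness array; n must be >= 7."""
    rng = rng or np.random.default_rng()
    r = rng.choice(len(pop), size=7, replace=False)   # distinct random indices
    base = r[np.argmin(fit[r[:3]])]                   # best of the first three picks
    d1, d2, d3, d4 = r[3:]                            # remaining indices form two difference vectors
    return pop[base] + F * (pop[d1] - pop[d2]) + F * (pop[d3] - pop[d4])
```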
A New Differential Evolution Algorithm with Alopex-Based Local Search
Differential evolution (DE), as a class of biologically inspired and meta-heuristic techniques, has attained increasing popularity in solving many real-world optimization problems. However, DE is not always successful: it can easily get stuck in a local optimum or an undesired stagnation condition. This paper proposes a new DE algorithm, Differential Evolution with Alopex-Based Local Search (DEA...
A Bayesian Local Linear Wavelet Neural Network
In general, wavelet neural networks suffer from the curse of dimensionality: the number of hidden units required grows exponentially with the input dimension. To address this problem, a wavelet neural network incorporating a local linear model has already been proposed. In that network design, however, the number of hidden units is empirically determined and fi...
Journal title
Volume 27, Issue 8
Pages 1185–1194
Publication date: 2014-08-01